Identifying groups of strongly correlated variables through Smoothed Ordered Weighted ℓ1-norms
Abstract
In this article, we provide additional statements and proofs complementing the main paper, namely the proofs of the statements given there. The section numbers in this document correspond to the respective sections in the main paper.

3 Related Work: OWL, OSCAR, and SLOPE

3.1 Proof of Proposition 3.1

Proof. 1. Let $a = |w|$ and assume WLOG that $a_1 \geq \cdots \geq a_d \geq 0$. Then the Lovász extension is $p(a) = \sum_{i=1}^{d} a_i c_i$, where $c_i = f(i) - f(i-1)$ (see [1]). We get the result.

2. The derived penalty is non-decreasing since $c \geq 0$, and since $c$ forms a decreasing sequence, $P$ is submodular. Hence the result.

4 SOWL Definition and Properties

The lemma below states that $\Omega_S$ is a valid norm.

Lemma 4.A. Let $w \in \mathbb{R}^d$. Then $\Omega_S(w)$ defined in (SOWL) is a valid norm if $c_1 + \cdots + c_d \geq 0$.

Proof. From Proposition 3.1, we see that we can derive a submodular function $P$ such that $P(\emptyset) = 0$ and $P(A) > 0$ for every nonempty $A \subseteq \{1, \ldots, d\}$. Now, $\Omega_S$ is a special case of the norms proposed in [2, Section 2], which are indeed valid norms.

The statements below show that for every $c$ satisfying $c_1 \geq \cdots \geq c_d$, there exists $\tilde{c}$ satisfying $\tilde{c}_1 \geq \cdots \geq \tilde{c}_d \geq 0$ such that $\Omega_S$ is the same for both $c$ and $\tilde{c}$.

Lemma 4.B. Let $w \in \mathbb{R}^d$, and let $c \in \mathbb{R}^d$ be such that $c_1 \geq \cdots \geq c_d$. Let $k$ be the minimum integer such that $c_k + \cdots + c_d \geq 0$. Define $\tilde{c} \in \mathbb{R}^d$ by $\tilde{c}_i = c_i$ for $i = 1, \ldots, k-1$, $\tilde{c}_k = c_k + \cdots + c_d$, and $\tilde{c}_i = 0$ for $i = k+1, \ldots, d$. Explicitly denoting the dependency of $\Omega_S$ on $c$ by $\Omega_S(w; c)$, we have $\Omega_S(w; c) = \Omega_S(w; \tilde{c})$.

Proof. Let $P$ be the submodular function constructed from $c$, and let $\tilde{P}$ be the corresponding function constructed from $\tilde{c}$. From [2, Lemma 3], both have the same Lower Combinatorial Envelope, which implies that $\Omega_S(w; c) = \Omega_S(w; \tilde{c})$.

4.1 Proof of Proposition 4.3

Proof. 1. Once we make the assumption on the lattice, the objective in (SOWL) is separable in terms of the variables within each group $G_j$, and the result follows.

2. This candidate $\delta_w$ is optimal only if, for small perturbations around $\eta_w$, the objective function increases. Let us denote by $\Gamma(\eta) = \sum_{i=1}^{d} c_i \eta_{(i)}$ the Lovász extension of $P$. From [1], we see that around $\eta_w$ we have the decomposition of $\Gamma$ as $\Gamma(\eta + d\eta) = \Gamma(\eta) + \dots$
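To make the identity used in the proof of Proposition 3.1 concrete, the following sketch compares the OWL value $\sum_i c_i |w|_{(i)}$ with the Lovász extension of the cardinality-based set function $P(A) = f(|A|)$, where $f(k) = c_1 + \cdots + c_k$, evaluated at $|w|$. This is only an illustrative sketch and not code from the paper; the helper names `owl_norm` and `lovasz_extension` and the random test data are our own assumptions.

```python
import numpy as np

def owl_norm(w, c):
    """OWL penalty: sum_i c_i * |w|_(i), with |w|_(1) >= ... >= |w|_(d)."""
    a = np.sort(np.abs(w))[::-1]
    return float(np.dot(c, a))

def lovasz_extension(P, a):
    """Lovász extension of a set function P (with P(empty set) = 0) at a >= 0."""
    order = np.argsort(-a)            # coordinates sorted by decreasing value of a
    value, prev, selected = 0.0, 0.0, []
    for i in order:
        selected.append(i)
        cur = P(frozenset(selected))
        value += a[i] * (cur - prev)  # telescoping increments of P along the chain
        prev = cur
    return value

rng = np.random.default_rng(0)
d = 6
c = np.sort(rng.random(d))[::-1]                # non-increasing, non-negative weights
f = np.concatenate(([0.0], np.cumsum(c)))       # f(0) = 0, f(k) = c_1 + ... + c_k

def P(A):                                       # cardinality-based set function P(A) = f(|A|)
    return f[len(A)]

w = rng.standard_normal(d)
print(owl_norm(w, c), lovasz_extension(P, np.abs(w)))   # the two values agree
```

For non-increasing, non-negative weights $c$, the two printed values coincide, matching the identity $p(a) = \sum_{i=1}^d a_i c_i$ above.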
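The second part of that proof relies on the standard fact that a cardinality-based set function $P(A) = f(|A|)$ is submodular whenever $f$ is concave, i.e. whenever the increments $c_i = f(i) - f(i-1)$ are non-increasing. The brute-force check below illustrates this on a small example; the helper `is_submodular` and the specific weights are assumptions of ours, not taken from the paper.

```python
import numpy as np
from itertools import combinations

def is_submodular(P, ground):
    """Check P(A) + P(B) >= P(A | B) + P(A & B) over all pairs of subsets."""
    subsets = [frozenset(s) for r in range(len(ground) + 1)
               for s in combinations(ground, r)]
    return all(P(A) + P(B) + 1e-12 >= P(A | B) + P(A & B)
               for A in subsets for B in subsets)

c = np.array([2.0, 1.5, 0.5, 0.2, 0.1])        # non-increasing weights
f = np.concatenate(([0.0], np.cumsum(c)))      # concave f with f(0) = 0

def P(A):                                       # cardinality-based set function P(A) = f(|A|)
    return f[len(A)]

print(is_submodular(P, range(5)))               # True: concave f yields a submodular P
```

Replacing the weights with an increasing sequence makes the check fail, consistent with the requirement that $c$ be non-increasing for $P$ to be submodular.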